Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays

Authors

  • Jinde Cao
  • Jun Wang
Abstract

This paper investigates the absolute exponential stability of a general class of delayed neural networks whose activation functions are required only to be partially Lipschitz continuous and monotone nondecreasing, not necessarily differentiable or bounded. Using a delay Halanay-type inequality and a Lyapunov function, three new sufficient conditions are derived for ascertaining whether the equilibrium points of delayed neural networks with additively diagonally stable interconnection matrices are absolutely exponentially stable. The stability criteria also apply to delayed optimization neural networks and delayed cellular neural networks, whose activation functions are often nondifferentiable or unbounded. The results answer the question: if a neural network without delay is absolutely exponentially stable, under what additional conditions is the corresponding delayed neural network also absolutely exponentially stable?
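For orientation, the following is a minimal sketch of the delayed recurrent neural network model typically studied in this line of work; the abstract does not state the equations, so the notation below (states x_i, matrices A and B, delays \tau_{ij}) is an assumption rather than the authors' exact formulation:

\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t-\tau_{ij})\bigr) + u_i, \qquad i = 1, \dots, n,

where d_i > 0 are self-decay rates, A = (a_{ij}) and B = (b_{ij}) are the non-delayed and delayed interconnection matrices, \tau_{ij} \ge 0 are transmission delays, u_i are constant external inputs, and each activation f_j is assumed only to be partially Lipschitz continuous and monotone nondecreasing. Absolute exponential stability then means the equilibrium is globally exponentially stable for every activation function in this class and every input vector.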


Related articles

Global Exponential Stability of a General Class of Recurrent Neural Networks With Time-Varying Delays

This brief presents new theoretical results on the global exponential stability of neural networks with time-varying delays and Lipschitz continuous activation functions. These results include several sufficient conditions for the global exponential stability of general neural networks with time-varying delays, without assuming monotone, bounded, or continuously differentiable activation functions. In...


Absolute exponential stability of a class of continuous-time recurrent neural networks

This paper presents a new result on absolute exponential stability (AEST) of a class of continuous-time recurrent neural networks with locally Lipschitz continuous and monotone nondecreasing activation functions. Additively diagonally stable connection weight matrices are proven to guarantee AEST of the neural networks. The AEST result extends and improves the existing absolute s...


Anti-periodic Solutions for Recurrent Neural Networks without Assuming Global Lipschitz Conditions

In this paper we study recurrent neural networks with time-varying delays and continuously distributed delays. Without assuming global Lipschitz conditions on the activation functions, we establish the existence and local exponential stability of anti-periodic solutions.


Global Stability of a Class of Continuous-Time Recurrent Neural Networks

This paper investigates global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for existence and uniqueness of equilibrium of the neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions to ascertain the GAS of the ...


Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks

This note presents new results on global asymptotic stability (GAS) and global exponential stability (GES) of a general class of continuous-time recurrent neural networks with Lipschitz continuous and monotone nondecreasing activation functions. We first give three sufficient conditions for the GAS of neural networks. These testable sufficient conditions differ from and improve upon existing on...



Journal:
  • Neural Networks: the official journal of the International Neural Network Society

Volume 17, Issue 3

Pages: -

Publication date: 2004